Intracluster Moves for Constrained Discrete-Space MCMC
Authors
Abstract
This paper addresses the problem of sampling from binary distributions with constraints. In particular, it proposes an MCMC method to draw samples from a distribution over the set of all states at a specified distance from some reference state. For example, when the reference state is the vector of zeros, the algorithm can draw samples from a binary distribution with a constraint on the number of active variables, say the number of 1's. We motivate the need for this algorithm with examples from statistical physics and probabilistic inference. Unlike previous algorithms proposed to sample from binary distributions with these constraints, the new algorithm allows for large moves in state space and tends to propose moves that are energetically favourable. The algorithm is demonstrated on three Boltzmann machines of varying difficulty: a ferromagnetic Ising model (with positive potentials), a restricted Boltzmann machine with learned Gabor-like filters as potentials, and a challenging three-dimensional spin glass (with positive and negative potentials).
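To make the constraint concrete, the following is a minimal sketch of a baseline sampler for this setting: a plain single-swap Metropolis chain restricted to binary states with exactly k active variables, i.e. states at Hamming distance k from the all-zeros reference. This is not the intracluster algorithm proposed in the paper, and the energy parameters W and b below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the paper's intracluster algorithm): single-swap Metropolis
# over binary states with exactly k active variables. The swap proposal moves one
# active bit to an inactive position, so the constraint is preserved at every step.
import numpy as np

rng = np.random.default_rng(0)

n, k = 20, 5                        # n variables, exactly k of them equal to 1
W = rng.normal(scale=0.1, size=(n, n))
W = (W + W.T) / 2                   # symmetric couplings (illustrative)
np.fill_diagonal(W, 0.0)
b = rng.normal(scale=0.1, size=n)   # biases (illustrative)

def energy(s):
    """Boltzmann-machine energy E(s) = -0.5 s^T W s - b^T s."""
    return -0.5 * s @ W @ s - b @ s

def metropolis_swap(s, n_steps=10_000, beta=1.0):
    """Single-swap Metropolis chain on {s in {0,1}^n : sum(s) = k}."""
    s = s.copy()
    E = energy(s)
    for _ in range(n_steps):
        on = np.flatnonzero(s == 1)
        off = np.flatnonzero(s == 0)
        i, j = rng.choice(on), rng.choice(off)
        prop = s.copy()
        prop[i], prop[j] = 0, 1     # swap keeps the number of 1's fixed
        E_prop = energy(prop)
        if rng.random() < np.exp(-beta * (E_prop - E)):
            s, E = prop, E_prop     # symmetric proposal: plain Metropolis rule
    return s

s0 = np.zeros(n); s0[:k] = 1        # any state satisfying the constraint
sample = metropolis_swap(s0)
print(sample, sample.sum())         # sum(s) == k is preserved throughout
```

Single-swap moves like these can mix very slowly on frustrated models; the large, energetically informed moves described in the abstract are aimed at exactly this limitation.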
Similar papers
Evaluation of proposal distributions on clock-constrained trees in Bayesian phylogenetic inference
Bayesian Markov chain Monte Carlo (MCMC) has become one of the principal methods of performing phylogenetic inference. Implementing the Markov chain Monte Carlo algorithm requires the definition of a proposal distribution, which defines a transition kernel over the state space. The precise form of this transition kernel has a large impact on the computational efficiency of the algorithm. In this...
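As a brief illustration of how a proposal distribution q(x' | x) induces the transition kernel discussed in this snippet, here is a generic Metropolis-Hastings step. It is not specific to clock-constrained trees; log_target, propose, and log_q are hypothetical user-supplied callables (unnormalized log posterior, a sampler for q(x' | x), and log q(x' | x), respectively).

```python
# Generic Metropolis-Hastings transition: the choice of q changes the kernel
# (and hence mixing efficiency) while leaving the target distribution invariant.
import math
import random

def mh_step(x, log_target, propose, log_q):
    """One Metropolis-Hastings transition; returns the next state."""
    x_new = propose(x)
    log_accept = (log_target(x_new) - log_target(x)
                  + log_q(x, x_new) - log_q(x_new, x))  # Hastings correction
    if math.log(1.0 - random.random()) < log_accept:
        return x_new    # accept the proposed state
    return x            # reject: the chain stays put
```

Any valid q leaves the target invariant, but different proposals can change mixing time dramatically, which is the efficiency question this snippet raises.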
Markov Chain Monte Carlo Particle Algorithms for Discrete-Time Nonlinear Filtering
This work shows how a carefully designed instrumental distribution can improve the performance of a Markov chain Monte Carlo (MCMC) particle filter for systems with a high state dimension (up to 100). We devise a special subgradient-based kernel from which candidate moves are drawn. This facilitates the implementation of the filtering algorithm in high dimensional settings using a remarkably sm...
An MCMC model search algorithm for regression problems
We improve upon the Carlin and Chib MCMC algorithm that searches in model and parameter space. Our proposed algorithm attempts non-uniformly chosen ‘local’ moves in model space and avoids some pitfalls of other existing algorithms. In a series of examples with linear and logistic regression we report evidence that our proposed algorithm performs better than existing algorithms.
Accelerating MCMC with Active Subspaces
The Markov chain Monte Carlo (MCMC) method is the computational workhorse for Bayesian inverse problems. However, MCMC struggles in high-dimensional parameter spaces, since its iterates must sequentially explore a high-dimensional space for accurate inference. This struggle is compounded in physical applications when the nonlinear forward model is computationally expensive. One approach to acce...
Deep Latent Dirichlet Allocation with Topic-Layer-Adaptive Stochastic Gradient Riemannian MCMC
It is challenging to develop stochastic gradient based scalable inference for deep discrete latent variable models (LVMs), due to the difficulties in not only computing the gradients, but also adapting the step sizes to different latent factors and hidden layers. For the Poisson gamma belief network (PGBN), a recently proposed deep discrete LVM, we derive an alternative representation that is r...
Journal:
Volume, Issue:
Pages: -
Publication date: 2010